An Efficient Markov Chain Monte Carlo Method for Distributions with Intractable Normalising Constants

Authors

  • K. K. Berthelsen
  • Jesper Møller
  • A. N. Pettitt
  • R. W. Reeves
Abstract

We present new methodology for drawing samples from a posterior distribution when (i) the likelihood function or (ii) a part of the prior distribution is only specified up to a normalising constant. In case (i), the novelty lies in the introduction of an auxiliary variable in a Metropolis-Hastings algorithm and a choice of proposal distribution such that the algorithm does not depend on the unknown normalising constant. In case (ii), similar ideas apply, and the situation is even simpler as no auxiliary variable is required. Our method is "on-line", in contrast with alternative approaches to the problem which require "off-line" computations. Since simulation from the "unknown distribution" is needed, e.g. from the likelihood function in case (i), perfect simulation algorithms such as the Propp-Wilson algorithm become useful. We illustrate the method in case (i) by producing posterior samples when the likelihood is given by an Ising model and by a Strauss point process.
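The cancellation at the heart of case (i) can be sketched on a toy model. In the sketch below, the unnormalised density q_theta(u) = exp(-theta*u) on u > 0 stands in for the likelihood (its normalising constant Z(theta) = 1/theta is never evaluated by the sampler), and an ordinary exponential draw stands in for perfect simulation such as Propp-Wilson. The flat prior, the fixed auxiliary-density parameter theta_hat, and the random-walk step size are illustrative assumptions, not prescriptions from the paper:

```python
import math
import random

def unnorm_logq(theta, u):
    # Unnormalised exponential log-density: log q_theta(u) = -theta * u, u > 0.
    # Its normalising constant Z(theta) = 1/theta is never used below.
    return -theta * u

def aux_var_mh(y, n_iter, theta_hat=2.0, step=0.5, seed=0):
    """Single-auxiliary-variable Metropolis-Hastings in the spirit of the
    abstract: the proposal draws x' exactly from q_{theta'}/Z(theta'), so
    the unknown Z(theta) cancels from the acceptance ratio."""
    rng = random.Random(seed)
    theta = 1.0
    x = rng.expovariate(theta)              # exact draw from the model at theta
    samples = []
    for _ in range(n_iter):
        theta_p = theta + rng.gauss(0.0, step)    # symmetric random walk
        if theta_p > 0:                            # flat prior on theta > 0
            x_p = rng.expovariate(theta_p)         # exact ("perfect") simulation
            # Auxiliary density f(x | theta, y) taken as q_{theta_hat}/Z(theta_hat);
            # every normalising constant cancels in this log-ratio.
            log_h = (unnorm_logq(theta_hat, x_p) + unnorm_logq(theta_p, y)
                     + unnorm_logq(theta, x)
                     - unnorm_logq(theta_hat, x) - unnorm_logq(theta, y)
                     - unnorm_logq(theta_p, x_p))
            if math.log(rng.random()) < log_h:
                theta, x = theta_p, x_p
        samples.append(theta)
    return samples

samples = aux_var_mh(y=1.0, n_iter=200_000)
burned = samples[50_000:]
print(sum(burned) / len(burned))   # target posterior is Gamma(2, 1), mean 2
```

For this toy model the posterior is available in closed form (Gamma(2, y) under the flat prior), which makes it easy to check that the chain, which never touches Z(theta), still targets the right distribution.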


Similar articles

Miscellanea: An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants

Maximum likelihood parameter estimation and sampling from Bayesian posterior distributions are problematic when the probability density for the parameter of interest involves an intractable normalising constant which is also a function of that parameter. In this paper, an auxiliary variable method is presented which requires only that independent samples can be drawn from the unnormalised densi...

Full text

A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants

Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo...
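The MCMH idea described above, replacing the unknown ratio Z(theta')/Z(theta) with a Monte Carlo estimate, can be sketched on a toy unnormalised exponential density q_theta(u) = exp(-theta*u). The flat prior, the number of auxiliary draws m, and the step size are illustrative assumptions; the exact exponential draws stand in for whatever sampler produces the auxiliary variables in practice:

```python
import math
import random

def logq(theta, u):
    # Unnormalised log-density: log q_theta(u) = -theta * u, u > 0.
    return -theta * u

def mcmh(y, n_iter, m=30, step=0.5, seed=1):
    """Monte Carlo Metropolis-Hastings sketch: the unknown ratio
    Z(theta')/Z(theta) in the acceptance probability is replaced by an
    importance-sampling estimate from m draws x_i ~ q_theta / Z(theta)."""
    rng = random.Random(seed)
    theta = 1.0
    samples = []
    for _ in range(n_iter):
        theta_p = theta + rng.gauss(0.0, step)   # symmetric random walk
        if theta_p > 0:                           # flat prior on theta > 0
            xs = [rng.expovariate(theta) for _ in range(m)]
            # Unbiased estimate of Z(theta_p) / Z(theta):
            # E[q_{theta_p}(x) / q_theta(x)] under x ~ q_theta / Z(theta).
            r_hat = sum(math.exp(logq(theta_p, x) - logq(theta, x))
                        for x in xs) / m
            log_h = logq(theta_p, y) - logq(theta, y) - math.log(r_hat)
            if math.log(rng.random()) < log_h:
                theta = theta_p
        samples.append(theta)
    return samples

samples = mcmh(y=1.0, n_iter=100_000)
burned = samples[20_000:]
print(sum(burned) / len(burned))   # target posterior is Gamma(2, 1), mean 2
```

Unlike the auxiliary-variable construction, this chain is only approximately invariant for the target; the approximation improves as m grows, which for this toy model can be checked against the closed-form Gamma(2, y) posterior.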

Full text

Efficient computational strategies for doubly intractable problems with applications to Bayesian social networks

Powerful ideas recently appeared in the literature are adjusted and combined to design improved samplers for doubly intractable target distributions with a focus on Bayesian exponential random graph models. Different forms of adaptive Metropolis–Hastings proposals (vertical, horizontal and rectangular) are tested and merged with the delayed rejection (DR) strategy with the aim of reducing the v...

Full text

Continuously Tempered Hamiltonian Monte Carlo

Hamiltonian Monte Carlo (HMC) is a powerful Markov chain Monte Carlo (MCMC) method for performing approximate inference in complex probabilistic models of continuous variables. In common with many MCMC methods, however, the standard HMC approach performs poorly in distributions with multiple isolated modes. We present a method for augmenting the Hamiltonian system with an extra continuous tempe...

Full text

Advances in Markov chain Monte Carlo methods

Probability distributions over many variables occur frequently in Bayesian inference, statistical physics and simulation studies. Samples from distributions give insight into their typical behavior and can allow approximation of any quantity of interest, such as expectations or normalizing constants. Markov chain Monte Carlo (MCMC), introduced by Metropolis et al. (1953), allows sampling from d...

Full text


Journal title:

Volume   Issue

Pages  -

Publication date: 2004